
    Iterative criteria-based approach to engineering the requirements of software development methodologies

    Software engineering endeavours are typically based on and governed by the requirements of the target software; requirements identification is therefore an integral part of software development methodologies. Similarly, engineering a software development methodology (SDM) involves identifying the requirements of the target methodology. Methodology engineering approaches pay special attention to this issue; however, they make little use of existing methodologies as sources of insight into methodology requirements. The authors propose an iterative method for eliciting and specifying the requirements of an SDM using existing methodologies as supplementary resources. The method is performed as the analysis phase of a methodology engineering process aimed at the ultimate design and implementation of a target methodology. An initial set of requirements is first identified by analysing the characteristics of the development situation at hand and/or by delineating the general features desirable in the target methodology. These initial requirements are used as evaluation criteria and refined through iterative application to a select set of relevant methodologies. The finalised criteria highlight the qualities that the target methodology is expected to possess, and are therefore used as a basis for defining the final set of requirements. In an example, the authors demonstrate how the proposed elicitation process can be used to identify the requirements of a general object-oriented SDM. Owing to its basis in knowledge gained from existing methodologies and practices, the proposed method can help methodology engineers produce a set of requirements that is not only more complete in span, but also more concrete and rigorous.
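The iterative loop described in this abstract can be sketched in a few lines. This is a hypothetical illustration only: the function name, the set-of-keywords model of a methodology, and the relevance heuristic are all assumptions, not taken from the paper.

```python
# Hypothetical sketch of the iterative elicitation loop described above;
# all names and the relevance heuristic are illustrative, not the paper's.

def elicit_requirements(initial, methodologies, overlap=0.3, max_iter=10):
    """Refine an initial requirement set by using it as evaluation
    criteria against existing methodologies until it stabilises.
    Each methodology is modelled as a set of quality keywords."""
    criteria = set(initial)
    for _ in range(max_iter):
        # Keep only methodologies that already satisfy a fair share of
        # the current criteria; these are the relevant sources of insight.
        relevant = [m for m in methodologies
                    if len(m & criteria) / max(len(criteria), 1) >= overlap]
        # Absorb the qualities the relevant methodologies exhibit.
        refined = criteria.union(*relevant) if relevant else criteria
        if refined == criteria:          # fixed point: criteria are stable
            break
        criteria = refined
    return criteria                      # finalised criteria = requirements

print(elicit_requirements(
    {"iterative"},
    [{"iterative", "use-case-driven", "traceability"},
     {"iterative", "risk-driven"}]))
```

As the criteria set grows, further methodologies may become relevant on the next pass, which is what makes the process genuinely iterative rather than a single sweep.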

    Safe Specification of Operator Precedence Rules

    In this paper we present an approach to specifying operator precedence based on declarative disambiguation constructs and an implementation mechanism based on grammar rewriting. We identify a problem with existing generalized context-free parsing and disambiguation technology: generating a correct parser for a language such as OCaml using declarative precedence specification is not possible without resorting to some manual grammar transformation. Our approach provides a fully declarative solution to operator precedence specification for context-free grammars, is independent of any parsing technology, and is safe in that it guarantees that the language of the resulting grammar will be the same as the language of the specification grammar. We evaluate our new approach by specifying the precedence rules from the OCaml reference manual against the highly ambiguous reference grammar and validate the output of our generated parser.
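For context, the classic manual transformation the paper seeks to avoid is grammar stratification: rewriting an ambiguous rule such as `E -> E op E | atom` into one nonterminal per precedence level. The sketch below shows that textbook technique only; it is not the paper's rewriting algorithm, and the nonterminal naming scheme is made up.

```python
# Illustrative sketch of classic precedence stratification (the manual
# transformation the paper automates away); names are hypothetical.

def stratify(ops):
    """ops: list of (operator, associativity) pairs from lowest to
    highest precedence. Returns textual grammar rules, one nonterminal
    per precedence level."""
    rules = []
    for i, (op, assoc) in enumerate(ops):
        lhs, nxt = f"E{i}", f"E{i + 1}"
        if assoc == "left":    # left recursion yields left associativity
            rules.append(f"{lhs} -> {lhs} {op} {nxt} | {nxt}")
        else:                  # right recursion yields right associativity
            rules.append(f"{lhs} -> {nxt} {op} {lhs} | {nxt}")
    rules.append(f"E{len(ops)} -> atom")
    return rules

# Lowest precedence first: '+' binds weaker than '*'; '**' is right-assoc.
for rule in stratify([("+", "left"), ("*", "left"), ("**", "right")]):
    print(rule)
```

Doing this by hand for a grammar the size of OCaml's is error-prone, which motivates a declarative, safety-preserving alternative.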

    OSSMETER: Automated measurement and analysis of open source software

    Deciding whether an open-source software (OSS) project meets the required standards for adoption in terms of quality, maturity, activity of development and user support is not a straightforward process. It involves analysing various sources of information, including the project’s source code repositories, communication channels, and bug tracking systems. OSSMETER extends state-of-the-art techniques in the field of automated analysis and measurement of open-source software, and develops a platform that supports decision makers in the process of discovering, comparing, assessing and monitoring the health, quality, impact and activity of open-source software. To achieve this, OSSMETER computes trustworthy quality indicators by performing advanced analysis and integration of information from diverse sources, including the project metadata, source code repositories, communication channels and bug tracking systems of OSS projects.

    Capture-Avoiding and Hygienic Program Transformations

    Program transformations in terms of abstract syntax trees compromise referential integrity by introducing variable capture. Variable capture occurs when a variable declaration in the generated program accidentally shadows the intended target of a variable reference. Existing transformation systems either do not guarantee the avoidance of variable capture or impair the implementation of transformations. We present an algorithm called name-fix that automatically eliminates variable capture from a generated program by systematically renaming variables. name-fix is guided by a graph representation of the binding structure of a program, and requires name-resolution algorithms for the source language and the target language of a transformation. name-fix is generic and works for arbitrary transformations in any transformation system that supports origin tracking for names. We verify the correctness of name-fix and identify an interesting class of transformations for which name-fix provides hygiene. We demonstrate the applicability of name-fix for implementing capture-avoiding substitution, inlining, lambda lifting, and compilers for two domain-specific languages.
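The capture problem the abstract describes is easiest to see in the textbook case of substitution on lambda terms. The sketch below is that standard technique, shown only to illustrate what "variable capture" means; it is not the paper's graph-based name-fix algorithm, and the term encoding is an assumption.

```python
# Standard capture-avoiding substitution for the lambda calculus, shown
# to illustrate the capture problem name-fix addresses. Terms are tuples:
# ("var", x), ("lam", x, body), ("app", f, a). Not the paper's algorithm.

import itertools

_fresh = itertools.count()

def free_vars(t):
    kind = t[0]
    if kind == "var":
        return {t[1]}
    if kind == "lam":
        return free_vars(t[2]) - {t[1]}
    return free_vars(t[1]) | free_vars(t[2])   # application

def subst(t, x, s):
    """Substitute term s for variable x in term t, renaming bound
    variables whenever they would capture a free variable of s."""
    kind = t[0]
    if kind == "var":
        return s if t[1] == x else t
    if kind == "app":
        return ("app", subst(t[1], x, s), subst(t[2], x, s))
    y, body = t[1], t[2]
    if y == x:
        return t                       # x is shadowed; nothing to do
    if y in free_vars(s):
        fresh = f"{y}_{next(_fresh)}"  # rename binder to avoid capture
        body = subst(body, y, ("var", fresh))
        y = fresh
    return ("lam", y, subst(body, x, s))

# (\y. x) with x := y must NOT become (\y. y); the binder is renamed.
term = ("lam", "y", ("var", "x"))
print(subst(term, "x", ("var", "y")))
```

The renaming step in the `lam` case is exactly the kind of systematic fix that name-fix generalises from substitution to arbitrary transformations.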

    Measurement of the tt̄ production cross-section using eμ events with b-tagged jets in pp collisions at √s = 13 TeV with the ATLAS detector

    This paper describes a measurement of the inclusive top quark pair production cross-section (σtt̄) with a data sample of 3.2 fb−1 of proton–proton collisions at a centre-of-mass energy of √s = 13 TeV, collected in 2015 by the ATLAS detector at the LHC. This measurement uses events with an opposite-charge electron–muon pair in the final state. Jets containing b-quarks are tagged using an algorithm based on track impact parameters and reconstructed secondary vertices. The numbers of events with exactly one and exactly two b-tagged jets are counted and used to determine simultaneously σtt̄ and the efficiency to reconstruct and b-tag a jet from a top quark decay, thereby minimising the associated systematic uncertainties. The cross-section is measured to be: σtt̄ = 818 ± 8 (stat) ± 27 (syst) ± 19 (lumi) ± 12 (beam) pb, where the four uncertainties arise from data statistics, experimental and theoretical systematic effects, the integrated luminosity and the LHC beam energy, giving a total relative uncertainty of 4.4%. The result is consistent with theoretical QCD calculations at next-to-next-to-leading order. A fiducial measurement corresponding to the experimental acceptance of the leptons is also presented.
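The simultaneous extraction described above rests on two counting equations: with per-jet b-tagging efficiency eps_b, tagging correlation C_b ≈ 1, and an eμ acceptance factor, the one-tag and two-tag yields are N1 ∝ 2·eps_b·(1 − C_b·eps_b) and N2 ∝ C_b·eps_b², giving two equations in two unknowns. The sketch below solves them numerically; the yields and the acceptance factor G_em are synthetic numbers chosen purely for illustration (deliberately tuned to land near the quoted 818 pb), not the measured ATLAS inputs.

```python
# Numerical sketch of the counting method described above:
#   N1 = L * sigma * G_em * 2 * eps_b * (1 - C_b * eps_b)
#   N2 = L * sigma * G_em * C_b * eps_b**2
# eps_b: per-jet b-tagging efficiency; C_b: tagging correlation (~1);
# G_em: e-mu acceptance times efficiency. Two equations, two unknowns.
# All input numbers below are synthetic, not the measured ATLAS yields.

def solve_xsec(N1, N2, lumi, G_em, C_b=1.0):
    """Solve the pair of counting equations for (sigma, eps_b)."""
    r = N2 / N1
    eps_b = 2 * r / (C_b * (1 + 2 * r))     # from the N2/N1 ratio
    sigma = N1 / (lumi * G_em * 2 * eps_b * (1 - C_b * eps_b))
    return sigma, eps_b

# Illustrative inputs: 3200 pb^-1 luminosity, hypothetical yields.
sigma, eps_b = solve_xsec(N1=10366.0, N2=6335.0, lumi=3200.0, G_em=0.008)
print(f"sigma = {sigma:.0f} pb, eps_b = {eps_b:.2f}")
```

Because eps_b is determined from the data itself via the N2/N1 ratio, the result is largely insensitive to the simulated b-tagging efficiency, which is why the abstract notes that the associated systematic uncertainties are minimised.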

    Search for strong gravity in multijet final states produced in pp collisions at √s=13 TeV using the ATLAS detector at the LHC

    A search is conducted for new physics in multijet final states using 3.6 inverse femtobarns of data from proton–proton collisions at √s = 13 TeV taken at the CERN Large Hadron Collider with the ATLAS detector. Events are selected containing at least three jets with scalar sum of jet transverse momenta (HT) greater than 1 TeV. No excess is seen at large HT and limits are presented on new physics: models which produce final states containing at least three jets and having cross sections larger than 1.6 fb with HT > 5.8 TeV are excluded. Limits are also given in terms of new physics models of strong gravity that hypothesize additional space-time dimensions.

    Search for TeV-scale gravity signatures in high-mass final states with leptons and jets with the ATLAS detector at √s = 13 TeV

    A search for physics beyond the Standard Model, in final states with at least one high transverse momentum charged lepton (electron or muon) and two additional high transverse momentum leptons or jets, is performed using 3.2 fb−1 of proton–proton collision data recorded by the ATLAS detector at the Large Hadron Collider in 2015 at √s = 13 TeV. The upper end of the distribution of the scalar sum of the transverse momenta of leptons and jets is sensitive to the production of high-mass objects. No excess of events beyond Standard Model predictions is observed. Exclusion limits are set for models of microscopic black holes with two to six extra dimensions.

    Measurement of the correlation between flow harmonics of different order in lead-lead collisions at √sNN = 2.76 TeV with the ATLAS detector

    Correlations between the elliptic or triangular flow coefficients vm (m=2 or 3) and other flow harmonics vn (n=2 to 5) are measured using √sNN=2.76 TeV Pb+Pb collision data collected in 2010 by the ATLAS experiment at the LHC, corresponding to an integrated luminosity of 7 μb−1. The vm−vn correlations are measured at midrapidity as a function of centrality, and, for events within the same centrality interval, as a function of event ellipticity or triangularity defined in a forward rapidity region. For events within the same centrality interval, v3 is found to be anticorrelated with v2 and this anticorrelation is consistent with similar anticorrelations between the corresponding eccentricities, ε2 and ε3. However, it is observed that v4 increases strongly with v2, and v5 increases strongly with both v2 and v3. The trend and strength of the vm−vn correlations for n=4 and 5 are found to disagree with εm−εn correlations predicted by initial-geometry models. Instead, these correlations are found to be consistent with the combined effects of a linear contribution to vn and a nonlinear term that is a function of v2² or of v2v3, as predicted by hydrodynamic models. A simple two-component fit is used to separate these two contributions. The extracted linear and nonlinear contributions to v4 and v5 are found to be consistent with previously measured event-plane correlations.
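The two-component decomposition mentioned in the last sentence amounts to an ordinary least-squares problem. The sketch below uses the functional form v4 = c0 + c1·v2² as one plausible reading of "linear plus nonlinear term"; the form and the data points are illustrative assumptions, not the paper's fit or its measurements.

```python
# Sketch of a simple two-component fit: within a centrality class, model
# v4 as a linear (geometry-driven) term plus a nonlinear term in v2**2,
#   v4 = c0 + c1 * v2**2,
# and extract the coefficients by least squares. Synthetic data only.

import numpy as np

v2 = np.array([0.04, 0.06, 0.08, 0.10, 0.12])
v4 = 0.005 + 0.9 * v2**2                         # toy "measurement"

A = np.column_stack([np.ones_like(v2), v2**2])   # design matrix
(c0, c1), *_ = np.linalg.lstsq(A, v4, rcond=None)
print(f"linear c0 = {c0:.4f}, nonlinear c1 = {c1:.3f}")
```

Separating the two coefficients this way is what allows the linear and nonlinear contributions to v4 and v5 to be compared with the event-plane correlation measurements cited in the abstract.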

    Search for dark matter produced in association with a hadronically decaying vector boson in pp collisions at √s = 13 TeV with the ATLAS detector

    A search is presented for dark matter produced in association with a hadronically decaying W or Z boson using 3.2 fb−1 of pp collisions at √s = 13 TeV recorded by the ATLAS detector at the Large Hadron Collider. Events with a hadronic jet compatible with a W or Z boson and with large missing transverse momentum are analysed. The data are consistent with the Standard Model predictions and are interpreted in terms of both an effective field theory and a simplified model containing dark matter.

    Search for Supersymmetry in Di-Photon Final States at √s = 1.96 TeV

    We report results of a search for supersymmetry (SUSY) with gauge-mediated symmetry breaking in di-photon events collected by the D0 experiment at the Fermilab Tevatron Collider in 2002–2006. In 1.1 fb−1 of data, we find no significant excess beyond the background expected from the standard model and set the most stringent lower limits to date for a standard benchmark model on the lightest neutralino and chargino masses of 125 GeV and 229 GeV, respectively, at 95% confidence.
